Generalized-Hukuhara-Gradient Efficient-Direction Method to Solve Optimization Problems with Interval-Valued Functions and Its Application in Least-Squares Problems
Abstract
This article proposes a general gH-gradient efficient-direction method and a $${\mathcal{W}}$$-gH-gradient efficient-direction method for optimization problems with interval-valued functions. The convergence analysis and step-wise algorithms of both methods are presented. It is observed that the $${\mathcal{W}}$$-gH-gradient efficient-direction method converges linearly for a strongly convex objective function. To develop the proposed methods and to study their convergence, the ideas of strong convexity and a sequential criterion for gH-continuity of interval-valued functions are illustrated. In the sequel, a new definition of gH-differentiability for interval-valued functions is also proposed. The new gH-differentiability is described with the help of a newly defined concept of linear interval-valued functions and is noticed to be superior to the existing ones. For a gH-differentiable interval-valued function, an optimality condition for the interval optimization problem is derived. With the help of the derived condition, the notion of an efficient direction is introduced, and this idea is used to develop the proposed gradient methods. As an application of the methods, least-squares fitting of interval-valued data is solved. The least-squares fitting is illustrated by polynomial fitting and logistic curve fitting.
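The least-squares application described above can be sketched in a much simplified form. The snippet below is only an illustrative gradient descent on a midpoint scalarization of interval-valued data; it is NOT the paper's gH-gradient or $${\mathcal{W}}$$-gH-gradient efficient-direction method, and the function and data names are hypothetical.

```python
import numpy as np

# Hypothetical sketch: fit a line y ~ a*t + b to interval-valued data
# [y_lo, y_hi] by plain gradient descent on the mean squared *midpoint*
# residual (a common scalarization of an interval objective). This is a
# simplification, not the paper's efficient-direction method.

def interval_lsq_gradient_descent(t, y_lo, y_hi, steps=5000, lr=0.01):
    y_mid = (y_lo + y_hi) / 2.0          # midpoints of the interval data
    a, b = 0.0, 0.0
    n = len(t)
    for _ in range(steps):
        r = a * t + b - y_mid            # midpoint residuals
        grad_a = 2.0 * np.dot(r, t) / n  # gradient w.r.t. slope a
        grad_b = 2.0 * np.sum(r) / n     # gradient w.r.t. intercept b
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

t = np.array([0.0, 1.0, 2.0, 3.0])
a, b = interval_lsq_gradient_descent(t, t * 2 - 0.1, t * 2 + 0.1)
# a ~ 2, b ~ 0 for this synthetic interval data centred on y = 2t
```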
Related articles
On interval-valued optimization problems with generalized invex functions
*Correspondence: [email protected]. 1Department of Mathematics and Statistics, King Fahd University of Petroleum and Minerals, Dhahran 31261, Saudi Arabia. 2Permanent address: Department of Mathematics, Aligarh Muslim University, Aligarh 202 002, India. Full list of author information is available at the end of the article. Abstract: This paper is devoted to the study of interval-valued optimization p...
Gradient methods and conic least-squares problems
This paper presents two reformulations of the dual of the constrained least squares problem over convex cones. In addition, it extends Nesterov’s excessive gap method 1 [21] to more general problems. The conic least squares problem is then solved by applying the resulting modified method, or Nesterov’s smooth method [22], or Nesterov’s excessive gap method 2 [21], to the dual reformulations. Nu...
Using Perturbed QR Factorizations to Solve Linear Least-Squares Problems
We propose and analyze a new tool to help solve sparse linear least-squares problems min_x ‖Ax − b‖₂. Our method is based on a sparse QR factorization of a low-rank perturbation Â of A. More precisely, we show that the R factor of Â is an effective preconditioner for the least-squares problem min_x ‖Ax − b‖₂, when solved using LSQR. We propose applications for the new technique. When A is rank defi...
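As background for the QR-based approach this snippet describes, a minimal dense example shows how a factorization A = QR reduces min_x ‖Ax − b‖₂ to a triangular solve. The preconditioned sparse LSQR scheme of the paper is not reproduced here; the matrices are illustrative.

```python
import numpy as np

# Minimal illustration: solve the dense least-squares problem
# min_x ||Ax - b||_2 via a thin QR factorization A = QR, which
# reduces it to the triangular system R x = Q^T b.

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])     # exactly b = 1 + t for t = 0, 1, 2

Q, R = np.linalg.qr(A)            # thin QR: Q is 3x2, R is 2x2 upper triangular
x = np.linalg.solve(R, Q.T @ b)   # back-substitution on R x = Q^T b
# the data are exact here, so the least-squares solution is x = [1, 1]
```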
Numerical methods for generalized least squares problems
Usually generalized least squares problems are solved by transforming them into regular least squares problems which can then be solved by well-known numerical methods. However, this approach is not very effective in some cases and, besides, is very expensive for large scale problems. In 1979, Paige suggested another approach which consists of solving an equivalent equality-constrained least sq...
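The transformation this snippet mentions, turning a generalized least-squares problem into a regular one, can be sketched as a Cholesky "whitening" step; the matrices below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Sketch: for min_x (Ax - b)^T W^{-1} (Ax - b) with noise covariance W,
# factor W = L L^T (Cholesky) and solve the ordinary least-squares
# problem min_x || L^{-1} (Ax - b) ||_2 on the "whitened" data.

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.1, 1.9, 3.0])
W = np.diag([1.0, 0.25, 4.0])         # illustrative heteroscedastic covariance

L = np.linalg.cholesky(W)
A_w = np.linalg.solve(L, A)           # whitened design matrix L^{-1} A
b_w = np.linalg.solve(L, b)           # whitened observations L^{-1} b
x, *_ = np.linalg.lstsq(A_w, b_w, rcond=None)
```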
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
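For context on conjugate-gradient methods, a textbook linear CG routine with a Fletcher-Reeves-style beta update is sketched below for the quadratic case; the paper's modified secant-condition parameter is not implemented here.

```python
import numpy as np

# Textbook linear conjugate gradient for minimizing 0.5 x^T H x - b^T x
# with H symmetric positive definite (equivalently, solving H x = b).
# For an n x n system it converges in at most n iterations in exact arithmetic.

def conjugate_gradient(H, b, tol=1e-10):
    x = np.zeros_like(b)
    r = b - H @ x                      # residual = negative gradient
    d = r.copy()                       # first direction: steepest descent
    for _ in range(len(b)):
        if np.linalg.norm(r) < tol:
            break
        Hd = H @ d
        alpha = (r @ r) / (d @ Hd)     # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Hd
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves-style parameter
        d = r_new + beta * d           # new conjugate direction
        r = r_new
    return x

H = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(H, b)           # x = H^{-1} b = [1/11, 7/11]
```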
Journal
Journal title: International Journal of Fuzzy Systems
Year: 2021
ISSN: 2199-3211, 1562-2479
DOI: https://doi.org/10.1007/s40815-021-01175-x